The instrumental temperature record shows fluctuations in the temperature of the global land surface and oceans. The data are collected from several thousand meteorological stations, Antarctic research stations and satellite observations of sea-surface temperature. Currently, the longest-running temperature record is the Central England temperature series, which starts in 1659. The longest-running quasi-global record starts in 1850.[1]
Currently, the Hadley Centre maintains HADCRUT3, a monthly-mean global surface temperature analysis,[2] and NASA maintains GISTEMP, another monthly-mean global surface temperature analysis, for the period since 1880.[3] The two analyses differ in the details of how they obtain temperature values on a regular grid from the network of irregularly spaced observation sites; thus, their results for global and regional temperature differ slightly. The United States National Oceanic and Atmospheric Administration (NOAA) maintains the Global Historical Climatology Network (GHCN-Monthly) database, which contains historical temperature, precipitation, and pressure data for thousands of land stations worldwide.[4] Also, NOAA's National Climatic Data Center (NCDC), which has "the world's largest active archive"[5] of surface temperature measurements, maintains a global temperature record since 1880.[6]
The period for which reasonably reliable instrumental records of near-surface temperature exist with quasi-global coverage is generally considered to begin around 1850. Earlier records exist, but with sparser coverage and less standardized instrumentation.
The temperature data for the record come from measurements from land stations and ships. On land, temperature sensors are kept in a Stevenson screen or a maximum minimum temperature system (MMTS). The sea record consists of surface ships taking sea temperature measurements from engine inlets or buckets. The land and marine records can be compared.[7] Land and sea measurement and instrument calibration is the responsibility of national meteorological services. Standardization of methods is organized through the World Meteorological Organization and its predecessor, the International Meteorological Organization.[8]
Currently, most meteorological observations are taken for use in weather forecasts. Centres such as ECMWF show instantaneous maps of their coverage, and the Hadley Centre shows the coverage for the average of the year 2000. Coverage earlier in the 20th and 19th centuries would have been significantly less. While temperature changes vary both in size and direction from one location to another, the numbers from different locations are combined to produce an estimate of a global average change.
There are concerns about possible uncertainties in the instrumental temperature record, including the fraction of the globe covered, the effects of changing thermometer designs and observing practices, and the effects of changing land use around the observing stations. The ocean temperature record too suffers from changing practices (such as the switch from collecting water in canvas buckets to measuring the temperature from engine intakes[9]), but it is immune to the urban heat island effect and to changes in local land use/land cover (LULC) at the land surface station.
Most of the observed warming occurred during two periods: 1910 to 1945 and 1976 to 2000; the cooling/plateau from 1945 to 1976 has been mostly attributed to sulphate aerosols.[10] However, a study in 2008 suggests that the temperature drop of about 0.3 °C in 1945 could be the apparent result of uncorrected instrumental biases in the sea surface temperature record.[9] Attribution of the temperature change to natural or anthropogenic factors is an important question: see global warming and attribution of recent climate change.
Land and sea measurements independently show much the same warming since 1860.[11] The data from these stations show an average surface temperature increase of about 0.74 °C during the last 100 years. The Intergovernmental Panel on Climate Change (IPCC) stated in its Fourth Assessment Report (AR4) that the temperature rise over the 100-year period from 1906 to 2005 was 0.74 °C [0.56 to 0.92 °C], with 90% confidence.
For the last 50 years, the linear warming trend has been 0.13 °C [0.10 to 0.16 °C] per decade according to AR4.
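A per-decade trend figure of this kind comes from fitting a straight line to the annual anomaly series by ordinary least squares. A minimal sketch, using illustrative placeholder values rather than the actual AR4 data:

```python
# Least-squares linear trend of annual temperature anomalies,
# expressed in degrees C per decade. The anomaly values below are
# hypothetical placeholders, not the real HadCRUT/GISS series.

def trend_per_decade(years, anomalies):
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(anomalies) / n
    # Ordinary least-squares slope, in degrees C per year
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, anomalies))
    den = sum((x - mean_x) ** 2 for x in years)
    slope = num / den
    return slope * 10  # convert per-year slope to per-decade

years = [2000, 2001, 2002, 2003, 2004]
anoms = [0.39, 0.52, 0.58, 0.58, 0.54]  # hypothetical anomalies
print(round(trend_per_decade(years, anoms), 3))  # prints 0.36
```

In practice the published trend also carries an uncertainty range, estimated from the scatter of the residuals about the fitted line.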
The U.S. National Academy of Sciences, both in its 2002 report to President George W. Bush, and in later publications, has strongly endorsed evidence of an average global temperature increase in the 20th century.[12]
The IPCC Fourth Assessment Report found that the instrumental temperature record for the past century included urban heat island effects but that these were primarily local, having a negligible influence on global temperature trends (less than 0.006 °C per decade over land and zero over the oceans).
The preliminary results of an independent assessment, carried out by the Berkeley Earth Surface Temperature group and made public in October 2011, found that over the past 50 years the land surface warmed by 0.911 °C, and their results mirror those obtained from earlier studies carried out by NOAA, the Hadley Centre and NASA's GISS. The study addressed scientific concerns raised by skeptics, including the urban heat island effect, poor station quality, and the risk of data selection bias, and found that these effects did not bias the results obtained in those earlier studies.[13][14][15][16]
For more information about the effects, if any, of urbanization on the temperature record, see the main article: Urban heat island effect
The global temperature changes are not uniform over the globe, nor would they be expected to be, whether the changes were naturally or humanly forced.
Temperature trends from 1901 are positive over most of the world's surface, except for the Atlantic Ocean south of Greenland, the south-eastern USA and parts of Bolivia. Warming is strongest over interior land areas in Asia and North America, as well as over south-eastern Brazil and some areas in the South Atlantic and Indian Oceans.
Since 1979, the temperature increase has been considerably stronger over land, while cooling has been observed over some oceanic regions in the Pacific Ocean and Southern Hemisphere. The spatial pattern of the ocean temperature trend in those regions is possibly related to the Pacific Decadal Oscillation and the Southern Annular Mode.[17]
Seasonal temperature trends are positive over most of the globe, but weak cooling is observed over the mid-latitudes of the Southern Ocean, and also over eastern Canada in spring due to a strengthening of the North Atlantic Oscillation. Warming is stronger over northern Europe, China and North America in winter; over the interiors of Europe and Asia in spring; over Europe and north Africa in summer; and over northern North America, Greenland and eastern Asia in autumn. Enhanced warming over northern Eurasia is partly linked to the Northern Annular Mode,[18][19] while in the Southern Hemisphere the trend toward stronger westerlies over the Southern Ocean favoured cooling over much of Antarctica, with the exception of the Antarctic Peninsula, where strong westerlies decrease cold-air outbreaks from the south.[20] The Antarctic Peninsula has warmed by 2.5 °C (4.5 °F) in the past five decades at Bellingshausen Station.[21]
Deriving a reliable global temperature from the instrument data is not easy because the instruments are not evenly distributed across the planet, the hardware and observing locations have changed over the years, and there has been extensive land use change (such as urbanization) around some of the sites.
The calculation needs to filter out the changes that have occurred over time that are not climate related (e.g. urban heat islands), then interpolate across regions where instrument data has historically been sparse (e.g. in the southern hemisphere and at sea), before an average can be taken.
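The gridding-and-averaging step described above can be illustrated with a toy scheme: station anomalies are binned into latitude/longitude cells, each cell is averaged, and cells are weighted by the cosine of their latitude so that the shrinking area of high-latitude cells is accounted for. This is a minimal sketch of the general idea, not any agency's actual algorithm, and the station values are hypothetical:

```python
import math

# Toy area-weighted gridding of station temperature anomalies.
# stations: list of (latitude, longitude, anomaly in degrees C).

def global_mean_anomaly(stations, cell_deg=5.0):
    cells = {}  # (lat_bin, lon_bin) -> list of anomalies in that cell
    for lat, lon, anom in stations:
        key = (math.floor(lat / cell_deg), math.floor(lon / cell_deg))
        cells.setdefault(key, []).append(anom)
    num = den = 0.0
    for (lat_bin, _), anoms in cells.items():
        cell_lat = (lat_bin + 0.5) * cell_deg  # cell-centre latitude
        w = math.cos(math.radians(cell_lat))   # area weight
        num += w * (sum(anoms) / len(anoms))   # cell mean, weighted
        den += w
    return num / den

# Hypothetical stations: two near London, one near Singapore
stations = [(51.5, -0.1, 0.8), (52.2, 0.1, 0.6), (1.3, 103.8, 0.3)]
print(round(global_mean_anomaly(stations), 2))  # prints 0.52
```

Real analyses differ in how they fill cells with no stations at all: HadCRUT historically left them out of the average, while GISTEMP interpolates from stations up to a fixed radius away.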
There are three main datasets showing analyses of global temperatures, all developed since the late 1970s. The HadCRUT analysis is compiled in a collaboration between the University of East Anglia's Climatic Research Unit and the Hadley Centre for Climate Prediction and Research;[3][4] independent analyses, largely based on the same raw data but using different levels of interpolation, are produced by the Goddard Institute for Space Studies and by the National Climatic Data Center.[22] These datasets are updated on a monthly basis and are generally in close agreement.
In the late 1990s, the Goddard team used the same data to produce a global map of temperature anomalies to illustrate the difference between the current temperature and average temperatures prior to 1950 across every part of the globe.[23]
In September 2007, the GISTEMP software, which is used to process the GISS version of the historical instrument data, was made public. The software that was released had been developed over more than 20 years by numerous staff and is mostly in FORTRAN; large parts of it were developed in the 1980s, before large amounts of computer memory and modern programming languages and techniques were available.
Two recent open-source projects have been undertaken by individuals to rewrite the processing software in modern, open code. One, http://www.opentemp.org/, was by John van Vliet. More recently, a project begun in April 2008 (Clear Climate Code) by staff of Ravenbrook Ltd to port the code to Python has so far detected two minor bugs in the original software, which did not significantly change any results.[24]
A number of scientists and scientific organizations have expressed concern about the possible deterioration of the land surface observing network.[25][26][27][28] Climate scientist Roger A. Pielke has stated that he has identified a number of sites where poorly sited stations in sparse regions "will introduce spatially unrepresentative data into the analyses."[29] University of Alabama-Huntsville professor of atmospheric science and former IPCC lead author John Christy has stated that "[t]he temperature records cannot be relied on as indicators of global change."[30] The metadata needed to quantify the uncertainty from poorly sited stations does not currently exist. Pielke has called for a similar documentation effort for the rest of the world.[31]
The uncertainty in annual measurements of the global average temperature (95% range) is estimated to be ≈0.05 °C since 1950 and as much as ≈0.15 °C in the earliest portions of the instrumental record. The error in recent years is dominated by the incomplete coverage of existing temperature records. Early records also have a substantial uncertainty driven by systematic concerns over the accuracy of sea surface temperature measurements.[32][33] Station densities are highest in the northern hemisphere, providing more confidence in climate trends in this region. Station densities are far lower in other regions such as the tropics, northern Asia and the former Soviet Union. This results in less confidence in the robustness of climate trends in these areas. If a region with few stations includes a poor quality station, the impact on global temperature would be greater than in a grid with many weather stations.[34]
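The last point is simple arithmetic: a biased station's contribution to a grid cell's mean is diluted by the number of stations averaged with it. A toy illustration, with hypothetical values:

```python
# Why one poor-quality station matters more in a sparsely observed cell:
# a +1.0 degree C bias is diluted by the number of stations averaged.
# All values are hypothetical.

def cell_mean(anomalies):
    return sum(anomalies) / len(anomalies)

true_anomaly = 0.5
dense = [true_anomaly] * 20 + [true_anomaly + 1.0]  # 21 stations, one biased
sparse = [true_anomaly, true_anomaly + 1.0]         # 2 stations, one biased

print(round(cell_mean(dense) - true_anomaly, 3))   # prints 0.048
print(round(cell_mean(sparse) - true_anomaly, 3))  # prints 0.5
```

In the dense cell the bias contributes 1.0/21 ≈ 0.048 °C to the cell mean; in the sparse cell it contributes 0.5 °C, an order of magnitude more.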
In 1999 a panel of the U.S. National Research Council studied the state of US climate observing systems.[35] The panel evaluated many climate measurement aspects, four of which had to do with temperature, against ten climate monitoring principles proposed by Karl et al. 1995. Land surface temperature had "known serious deficiencies" in five principles, vertical distribution and sea surface in nine, and subsurface ocean in seven.
The U.S. National Weather Service Cooperative Observer Program has established minimum standards regarding the instrumentation, siting, and reporting of surface temperature stations.[36] The observing systems available are able to detect year-to-year temperature variations such as those caused by El Niño or volcanic eruptions.[37] These stations can undergo undocumented changes such as relocation, changes in instrumentation and exposure (including changes in nearby thermally emitting structures), changes in land use (e.g., urbanization), and changes in observation practices. All of these changes can introduce biases into the stations' long term records. In the past, these local biases were generally considered to be random and therefore would cancel each other out using many stations and the ocean record.[37]
A 2006 paper analyzed a subset of 366 U.S. surface stations and found that 95% displayed a warming trend after land use/land cover (LULC) changes. The authors stated "this does not necessarily imply that the LULC changes are the causative factor."[38] Another study[39] has documented examples of well and poorly sited monitoring stations in the United States, including ones near buildings, roadways, and air conditioning exhausts. Brooks investigated Historical Climate Network (USHCN) sites in Indiana, and assigned 16% of the sites an ‘excellent’ rating, 59% a ‘good’ rating, 12.5% a ‘fair’ rating, and 12.5% a ‘poor’ rating.[40] Davey and Pielke visited 10 HCN sites in Eastern Colorado, but did not provide percentages of well or badly sited stations. They stated that some of the sites "are not at all representative of their surrounding region" and should be replaced in the instrumental temperature records with other sites from the U.S. cooperative observer network.[41]
Peterson has argued that existing empirical techniques for validating the local and regional consistency of temperature data are adequate to identify and remove biases from station records, and that such corrections allow information about long-term trends to be preserved.[42] Pielke and co-authors disagree.[43]
The list of warmest years on record is dominated by years from this millennium; each of the last 10 years (2001–2010) features as one of the 11 warmest on record. Although the NCDC temperature record begins in 1880, less accurate reconstructions of earlier temperatures suggest these years may be the warmest for several centuries to millennia.
Year | Global anomaly (°C)[44] | Land anomaly (°C)[45] | Ocean anomaly (°C)[46] |
---|---|---|---|
2005 | 0.6183 | 0.9593 | 0.4896 |
2010 | 0.6171 | 0.9642 | 0.4885 |
1998 | 0.5984 | 0.8320 | 0.5090 |
2003 | 0.5832 | 0.7735 | 0.5108 |
2002 | 0.5762 | 0.8318 | 0.4798 |
2006 | 0.5623 | 0.8158 | 0.4669 |
2009 | 0.5591 | 0.7595 | 0.4848 |
2007 | 0.5509 | 0.9852 | 0.3900 |
2004 | 0.5441 | 0.7115 | 0.4819 |
2001 | 0.5188 | 0.7207 | 0.4419 |
2008 | 0.4842 | 0.7801 | 0.3745 |
1997 | 0.4799 | 0.5583 | 0.4502 |
1999 | 0.4210 | 0.6759 | 0.3240 |
1995 | 0.4097 | 0.6533 | 0.3196 |
2000 | 0.3899 | 0.5174 | 0.3409 |
1990 | 0.3879 | 0.5479 | 0.3283 |
1991 | 0.3380 | 0.4087 | 0.3110 |
1988 | 0.3028 | 0.4192 | 0.2595 |
1987 | 0.2991 | 0.2959 | 0.3005 |
1994 | 0.2954 | 0.3604 | 0.2704 |
1983 | 0.2839 | 0.3715 | 0.2513 |
The values in the table above are anomalies from the 1901–2000 global mean of 13.9 °C.[47] For instance, the +0.55 °C anomaly in 2007 added to the 1901–2000 mean of 13.9 °C gives a global average temperature of 14.45 °C (58.00 °F) for 2007.[48]
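The conversion from anomaly to absolute temperature described above can be sketched directly; note that converting the rounded Celsius value gives 58.01 °F, so the text's 58.00 °F reflects rounding before conversion:

```python
# Converting a temperature anomaly to an absolute temperature:
# add the anomaly to the 1901-2000 baseline, then convert to
# Fahrenheit if needed.

BASELINE_C = 13.9  # 1901-2000 global mean stated in the text

def absolute_c(anomaly_c):
    return BASELINE_C + anomaly_c

def c_to_f(temp_c):
    return temp_c * 9.0 / 5.0 + 32.0

t2007 = absolute_c(0.55)        # 2007 anomaly, rounded to +0.55
print(round(t2007, 2))          # prints 14.45
print(round(c_to_f(t2007), 2))  # prints 58.01
```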
The coolest year in the record was 1911.[44]
Numerous cycles have been found to influence annual global mean temperatures. The tropical El Niño-La Niña cycle and the Pacific Decadal Oscillation are the most well-known of these cycles.[49] An examination of the average global temperature changes by decades reveals continuing climate change.[50] The following table is based on NASA data for combined land-surface air and sea-surface water temperature anomalies.
Years | Temp. anomaly from 1951–1980 mean, °C (°F) |
---|---|
1880–1889 | −0.274 °C (−0.493 °F) |
1890–1899 | −0.254 °C (−0.457 °F) |
1900–1909 | −0.259 °C (−0.466 °F) |
1910–1919 | −0.276 °C (−0.497 °F) |
1920–1929 | −0.175 °C (−0.315 °F) |
1930–1939 | −0.043 °C (−0.0774 °F) |
1940–1949 | 0.035 °C (0.0630 °F) |
1950–1959 | −0.02 °C (−0.0360 °F) |
1960–1969 | −0.014 °C (−0.0252 °F) |
1970–1979 | −0.001 °C (−0.00180 °F) |
1980–1989 | 0.176 °C (0.317 °F) |
1990–1999 | 0.313 °C (0.563 °F) |
2000–2009 | 0.513 °C (0.923 °F) |
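Decadal means like those in the table above are simply the average of the ten annual anomalies in each decade. A minimal sketch, using hypothetical annual values rather than the actual NASA series:

```python
# Decadal mean anomalies from a dict of annual anomalies.
# The annual values below are hypothetical, not NASA data.

def decade_means(annual):
    """annual: {year: anomaly in degrees C} -> {decade start: mean anomaly}"""
    decades = {}
    for year, anom in annual.items():
        decades.setdefault(year // 10 * 10, []).append(anom)
    return {d: sum(v) / len(v) for d, v in sorted(decades.items())}

annual = {2000 + i: 0.40 + 0.02 * i for i in range(10)}  # illustrative
print({d: round(m, 3) for d, m in decade_means(annual).items()})
# prints {2000: 0.49}
```

Averaging over a decade smooths out year-to-year variability from cycles such as El Niño, making the longer-term trend easier to see.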